Graph Structure Learning with Variational Information Bottleneck

Authors

Abstract

Graph Neural Networks (GNNs) have shown promising results on a broad spectrum of applications. Most empirical studies of GNNs directly take the observed graph as input, assuming the observed structure perfectly depicts the accurate and complete relations between nodes. However, graphs in the real world are inevitably noisy or incomplete, which could even exacerbate the quality of graph representations. In this work, we propose a novel Variational Information Bottleneck guided Graph Structure Learning framework, namely VIB-GSL, from the perspective of information theory. VIB-GSL is the first attempt to advance the Information Bottleneck (IB) principle for graph structure learning, providing a more elegant and universal framework for mining underlying task-relevant relations. VIB-GSL learns an informative and compressive graph structure to distill the actionable information for specific downstream tasks. VIB-GSL deduces a variational approximation for irregular graph data to form a tractable IB objective function, which facilitates training stability. Extensive experimental results demonstrate the superior effectiveness and robustness of the proposed VIB-GSL.
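The IB principle the abstract builds on trades off relevance to the task against compression of the input. A standard statement of the objective (following Tishby et al.; the symbols T for the compressed representation, X for the input, Y for the task variable, and the Lagrange multiplier β are the conventional ones, not taken from this abstract) is:

```latex
\max_{p(t \mid x)} \; \mathcal{L}_{\mathrm{IB}}
  \;=\; I(T; Y) \;-\; \beta \, I(T; X)
```

Here I(·;·) denotes mutual information: the representation T should keep as much information about Y as possible while discarding information about X, with β controlling the compression strength.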


Similar resources

Deep Variational Information Bottleneck

We present a variational approximation to the information bottleneck of Tishby et al. (1999). This variational approach allows us to parameterize the information bottleneck model using a neural network and leverage the reparameterization trick for efficient training. We call this method “Deep Variational Information Bottleneck”, or Deep VIB. We show that models trained with the VIB objective ou...
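The reparameterization trick mentioned above can be sketched in a few lines. This is a minimal NumPy illustration (not the paper's implementation; function names are chosen here for clarity) of the two ingredients a Deep VIB encoder optimizes: a differentiable Gaussian sample z = μ + σ·ε, and the closed-form KL divergence of that Gaussian from a standard normal prior, which upper-bounds the compression term I(Z;X).

```python
import numpy as np

def reparameterize(mu, log_var, rng):
    # Draw z = mu + sigma * eps with eps ~ N(0, I). Because the noise is
    # external, gradients can flow through mu and log_var during training.
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * log_var) * eps

def kl_to_standard_normal(mu, log_var):
    # Closed-form KL( N(mu, sigma^2) || N(0, I) ), summed over latent dims.
    return 0.5 * np.sum(np.exp(log_var) + mu**2 - 1.0 - log_var, axis=-1)

rng = np.random.default_rng(0)
mu = np.zeros((4, 8))       # batch of 4, latent dimension 8
log_var = np.zeros((4, 8))  # sigma = 1 everywhere
z = reparameterize(mu, log_var, rng)
kl = kl_to_standard_normal(mu, log_var)
```

With μ = 0 and log σ² = 0 the encoder matches the prior exactly, so the KL term is zero; in the full VIB objective this term is weighted by β against a cross-entropy term that stands in for I(Z;Y).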


Relevant sparse codes with variational information bottleneck

In many applications, it is desirable to extract only the relevant aspects of data. A principled way to do this is the information bottleneck (IB) method, where one seeks a code that maximizes information about a ‘relevance’ variable, Y , while constraining the information encoded about the original data, X . Unfortunately however, the IB method is computationally demanding when data are high-d...


Learning and Generalization with the Information Bottleneck

The Information Bottleneck is an information theoretic framework that finds concise representations for an ‘input’ random variable that are as relevant as possible for an ‘output’ random variable. This framework has been used successfully in various supervised and unsupervised applications. However, its learning theoretic properties and justification remained unclear as it differs from standard...


Compressing Neural Networks using the Variational Information Bottleneck

Neural networks can be compressed to reduce memory and computational requirements, or to increase accuracy by facilitating the use of a larger base architecture. In this paper we focus on pruning individual neurons, which can simultaneously trim model size, FLOPs, and run-time memory. To improve upon the performance of existing compression algorithms we utilize the information bottleneck princi...


The Mathematical Structure of Information Bottleneck Methods

Information Bottleneck-based methods use mutual information as a distortion function in order to extract relevant details about the structure of a complex system by compression. One of the approaches used to generate optimal compressed representations is by annealing a parameter. In this manuscript we present a common framework for the study of annealing in information distortion problems. We i...



Journal

Journal title: Proceedings of the ... AAAI Conference on Artificial Intelligence

Year: 2022

ISSN: 2159-5399, 2374-3468

DOI: https://doi.org/10.1609/aaai.v36i4.20335